MegEngine/MegCC


Chinese README

What is MegCC

MegCC is a deep-learning model compiler with the following features:

  • Extremely Lightweight Runtime: only the computation kernels required by your model are kept in the binary, e.g., an 81KB runtime for MobileNet v1
  • High Performance: every operation is carefully optimized by experts
  • Portable: generates nothing but computation code, which is easy to compile and use on Linux, Android, TEE, and bare metal
  • Low Memory Usage and Instant Boot: model optimization and memory planning are done at compile time, giving state-of-the-art memory usage with no extra CPU cost during inference

MegCC Structure

megcc_struct

The MegCC compiler is built on the MLIR infrastructure. Most of the kernel code it generates is hand-optimized. MegCC supports neural networks containing tensors with static or dynamic shapes. To help achieve the minimum binary size, it can also generate the necessary CV operators, so you don't need to link another giant CV library.

When compiling a model:

  • MegCC generates both the kernels used by the model and any user-required CV kernels
  • MegCC performs several optimizations, such as static memory planning and model optimization
  • MegCC dumps the results above into the final model file
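To make the static-memory-planning step above concrete, here is a minimal, generic sketch of the idea (this is an illustration of the technique, not MegCC's actual algorithm): when every tensor's lifetime and size are known at compile time, each tensor can be assigned a fixed offset in one preallocated arena, so no allocator runs during inference and tensors with disjoint lifetimes share memory.

```python
def plan_memory(tensors):
    """Greedy static memory planner (illustrative only).

    tensors: list of (name, first_use, last_use, size_bytes).
    Each tensor is placed at the lowest offset that does not overlap
    any already-placed tensor whose lifetime conflicts with it.
    Returns (offsets dict, peak arena size in bytes).
    """
    placed = []   # (first_use, last_use, offset, size)
    offsets = {}
    # Place larger tensors first; ties keep input order.
    for name, first, last, size in sorted(tensors, key=lambda t: -t[3]):
        # Occupied address ranges of tensors alive at the same time.
        conflicts = sorted(
            (off, off + sz)
            for f, l, off, sz in placed
            if not (l < first or last < f)
        )
        offset = 0
        for lo, hi in conflicts:
            if offset + size <= lo:
                break            # fits in the gap below this range
            offset = max(offset, hi)
        placed.append((first, last, offset, size))
        offsets[name] = offset
    peak = max((off + sz for _, _, off, sz in placed), default=0)
    return offsets, peak

# Tensors "a" and "c" have disjoint lifetimes, so they share offset 0;
# "b" overlaps "a" and is pushed above it.
offs, peak = plan_memory([
    ("a", 0, 1, 64), ("b", 1, 2, 64), ("c", 3, 4, 64),
])
```

Because the plan is computed once at compile time, the runtime only needs a single buffer of `peak` bytes, which is how a compile-time planner avoids any per-inference allocation cost.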

The MegCC runtime loads the model and uses the generated kernels to run inference. Only 81KB of binary is required to run MobileNet V1 inference (in fp32).

MegCC supports Arm64/ArmV7/X86/BareMetal backends. You may want to check the supported operator list.

Documentation

Get MegCC

How to use MegCC

  • Read how-to-use to see how to compile your models and deploy them; there is also a Chinese doc, 如何使用 (How to Use).
  • The MegCC runtime is easy to run on a standard OS, and even with no OS at all (example).

License

MegCC is licensed under the Apache License, Version 2.0.

Thanks a lot, please enjoy it